
Calculating Estimated ROI for a Specific Site & Body of Keywords

One of the biggest challenges for SEO is proving its worth. We all know it’s valuable, but it’s important to convey its value in terms that key stakeholders (up to and including CEOs) understand. To do that, I put together a process to calculate an estimate of ROI for implementing changes to keyword targeting.

In this post, I will walk through that process, so hopefully you can do the same for your clients (or as an in-house SEO to get buy-in), too!

Overview

  1. Gather your data
    1. Keyword Data
    2. Strength of your Preferred URLs
    3. Competition URLs by Keyword
    4. Strength of Competition URLs
  2. Analyze the Data by Keyword
  3. Calculate your potential opportunity

What you need

There are quite a few parts to this recipe, and while the calculation part is pretty easy, gathering the data to throw in the mix is the challenging part. I'll list each section here, including the components of each, and then we can go through how to retrieve each of them. (There's a quick sketch of what these datasets look like right after the list.)

  • Keyword data

    • list of keywords
    • search volumes for each keyword
    • preferred URLs on the site you're estimating ROI for
    • current rank
    • current ranking URL
  • Strength of your preferred URLs

    • De-duplicated list of preferred URLs
    • Page Authorities for each preferred URL
    • BONUS: External & Internal Links for each URL. You can include any measure you like here, as long as it’s something that can be compared (i.e. a number).
  • Where the competition sits

    • For each keyword, the sites that are ranking 1-10 in search currently
  • Strength of the competition

    • De-duplicated list of competing URLs
    • Page Authority and Domain Authority for each competing URL
    • BONUS: External & Internal Links for each competing URL. Include any measure you included on the Strength of Your Preferred URLs list.
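
To keep these datasets straight, here's a quick sketch of the shape of each one. The field names and values are made up purely for illustration (Python, just because it's compact):

```python
# Hypothetical row shapes for the four datasets described above.
# Field names are illustrative, not from any particular tool's export.

keyword_row = {
    "keyword": "blue widgets",
    "search_volume": 1300,        # monthly searches
    "preferred_url": "https://example.com/widgets/blue",
    "current_rank": 14,
    "current_ranking_url": "https://example.com/widgets",
}

preferred_url_row = {
    "url": "https://example.com/widgets/blue",
    "page_authority": 42,
    "external_links": 57,         # BONUS metric
    "internal_links": 210,        # BONUS metric
}

serp_row = {                      # one row per keyword/position pair
    "keyword": "blue widgets",
    "position": 1,
    "url": "https://competitor.com/blue-widgets",
}

competitor_url_row = {
    "url": "https://competitor.com/blue-widgets",
    "page_authority": 48,
    "domain_authority": 61,
    "external_links": 340,        # BONUS metric
    "internal_links": 95,         # BONUS metric
}
```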


How to get what you need


There has been quite a lot written about keyword research, so I won’t go into too much detail here. For the Keyword data list, the important thing is to get whatever keywords you’d like to assess into a spreadsheet, and include all the information listed above. You’ll have to select the preferred URLs based on what you think the strongest-competing and most appropriate URL would be for each keyword. 


For the Preferred URLs list, you'll want to use the data that's in your keyword data under the preferred URL.

  1. Copy the preferred URL data from your Keyword Data into a new tab. 
  2. Use the Remove Duplicates tool (Data>Data Tools in Excel) to remove any duplicated URLs
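
If you'd rather do this step outside Excel, pandas does the same job in a few lines. A minimal sketch, assuming your keyword data lives in a CSV with a preferred_url column (both names are assumptions; match them to your own export):

```python
import pandas as pd

# "keyword_data.csv" and the "preferred_url" column name are assumptions;
# adjust them to however you exported your keyword data.
keywords = pd.read_csv("keyword_data.csv")

# Equivalent of Excel's Remove Duplicates tool.
preferred_urls = (
    keywords["preferred_url"]
    .dropna()
    .drop_duplicates()
    .reset_index(drop=True)
)

preferred_urls.to_csv("preferred_urls.csv", index=False)
```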

Once you have the list of de-duplicated preferred URLs, you'll need to pull the data from Open Site Explorer for these URLs. I prefer using the Moz API with SEOTools. You'll have to install it to use it in Excel, or if you'd like to take a stab at using it in Google Docs, there are some resources available for that. Unfortunately, with the most recent update to Google Spreadsheets, I've had some difficulty with this method, so I've gone with Excel for now.

Once you’ve got SEOTools installed, you can make the call “=MOZ_URLMetrics_toFit([enter your cells])”. This should give you a list of URL titles, canonical URLs, external & internal links, and a few other metrics, including DA/PA.
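
If you'd rather skip Excel entirely, you can call the Moz API yourself. Here's a minimal sketch against what is currently the Links API v2 url_metrics endpoint; the endpoint, auth scheme, and response field names are my reading of Moz's documentation (not part of the SEOTools workflow), so verify them against the current docs before relying on this:

```python
import requests

# Placeholder credentials -- use your own Moz access ID and secret key.
ACCESS_ID = "your-access-id"
SECRET_KEY = "your-secret-key"

def fetch_url_metrics(urls):
    """Fetch PA/DA for a batch of URLs via the Moz Links API v2.

    Endpoint and field names are assumptions based on Moz's public
    docs; adjust to whatever the current API actually returns.
    """
    resp = requests.post(
        "https://lsapi.seomoz.com/v2/url_metrics",
        auth=(ACCESS_ID, SECRET_KEY),   # HTTP Basic auth per Moz docs
        json={"targets": urls},
        timeout=30,
    )
    resp.raise_for_status()
    rows = []
    for target, result in zip(urls, resp.json().get("results", [])):
        rows.append({
            "url": target,
            "page_authority": result.get("page_authority"),
            "domain_authority": result.get("domain_authority"),
        })
    return rows

print(fetch_url_metrics(["https://example.com/widgets/blue"]))
```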


For the Where the competition sits list, you'll first need to perform a search for each of your keywords. Obviously, you could do this manually, or if you have a keyword ranking tool that has been tracking the keywords you'd like to look at, you could export its data. If you don't have either, you can use the hacky method that I did: use the ImportXML command in Google Spreadsheets to grab the top ranking URLs for each query.

I’ve put a sample version of this together, which you can access here. A few caveats: you should be able to run many searches in a row (I had about 850 for my data, and they ran fine), but Google will block your IP address if you run too many. I also found that the results timed out relatively quickly, so copy them out as values into a different spreadsheet once you’ve got them. You can paste them straight into the Excel spreadsheet you’re building for the ROI calculations (you’ll need them there anyway!).
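
If Google Spreadsheets keeps giving you trouble, the same hacky idea is a few lines of Python. The usual caveats apply here too: the CSS selector below is a guess at Google's current result markup (inspect a live SERP and adjust it before trusting the output), and running requests too quickly will get your IP blocked just like with ImportXML:

```python
import time
import requests
from bs4 import BeautifulSoup

def top_ranking_urls(keyword, count=10):
    """Grab the top organic result URLs for one query.

    The "div.yuRUbf > a" selector is an assumption about Google's
    markup, which changes often -- verify against a live SERP.
    """
    resp = requests.get(
        "https://www.google.com/search",
        params={"q": keyword, "num": count},
        headers={"User-Agent": "Mozilla/5.0"},
        timeout=30,
    )
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")
    links = [a["href"] for a in soup.select("div.yuRUbf > a")]
    return links[:count]

for kw in ["blue widgets", "red widgets"]:
    print(kw, top_ranking_urls(kw))
    time.sleep(5)  # be polite; rapid-fire requests get blocked quickly
```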


From this list, you can pull each ranking URL into a single column and de-duplicate it as explained for the preferred URLs list, which gives you the Strength of the Competition list. Then run the same SEOTools analysis you did for the preferred URLs to generate matching data for these competing URLs.


Making your data work for you

Once you’ve got these lists, you can use some VLOOKUP magic to pull in the information you need. I used the Where the competition sits list as the foundation of my work.

From there, I pulled in the corresponding preferred URL and its Page Authority, as well as the PAs and DAs for each URL currently ranking 1-10. I was then able to calculate an average PA & DA for each query and compare the page I wanted to rank against those averages. From that comparison, I estimated the chances that the page I wanted to rank (given that I’d already determined these were relevant pages) could rank with better keyword targeting.
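
If you'd rather do this step in code than with VLOOKUPs, the same joins look like this in pandas (file and column names carried over from the earlier sketches, so they're illustrative, not prescriptive):

```python
import pandas as pd

# Illustrative file/column names matching the earlier sketches.
serp = pd.read_csv("serp_results.csv")                # keyword, position, url
competitors = pd.read_csv("competitor_metrics.csv")   # url, page_authority, domain_authority
keywords = pd.read_csv("keyword_data.csv")            # keyword, search_volume, preferred_url, current_rank
preferred = pd.read_csv("preferred_url_metrics.csv")  # url, page_authority

# The VLOOKUP step: attach PA/DA to every URL in the top 10.
serp = serp.merge(competitors, on="url", how="left")

# Average PA and DA of the current top 10 for each query.
competition = (
    serp.groupby("keyword")
        .agg(avg_pa=("page_authority", "mean"),
             avg_da=("domain_authority", "mean"))
        .reset_index()
)

# Line up each keyword's preferred URL and its own PA next to the averages.
analysis = (
    keywords
    .merge(competition, on="keyword", how="left")
    .merge(preferred.rename(columns={"url": "preferred_url",
                                     "page_authority": "preferred_pa"}),
           on="preferred_url", how="left")
)

print(analysis[["keyword", "preferred_pa", "avg_pa", "avg_da"]].head())
```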

Here’s where things get interesting. You can be rather conservative and only sum the search volumes of keywords you’re fairly confident your site can rank for, which is my preferred method. That’s because I use this method primarily to determine whether I’m on the right track: whether making these recommendations is really worth the time it takes to get them implemented. So I’m going to move forward counting only the search volumes of terms I think I’m quite competitive for, AND that I’m not yet ranking for on page 1.
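
Continuing the pandas sketch above, that conservative cut might look like this (the 0.9 PA margin is an arbitrary threshold I made up for illustration; set your own bar for "competitive"):

```python
# Conservative opportunity set: keep keywords where the preferred page's
# PA is within striking distance of the top-10 average (0.9 margin is
# arbitrary, purely illustrative) and we're not already on page 1.
opportunity = analysis[
    (analysis["preferred_pa"] >= 0.9 * analysis["avg_pa"])
    & (analysis["current_rank"] > 10)
]

opportunity_volume = opportunity["search_volume"].sum()
print(f"Addressable monthly search volume: {opportunity_volume:,}")
```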


Now, you want to move to your analytics data in order to calculate a few things: 

  • Conversion Rate
  • Average order value
  • Previous year’s revenue (for the section you’re looking at)

I’ve set up my sample data in this spreadsheet that you can refer to or use to make your own calculations. 

Each of the assumptions can be adjusted depending on the actual site data, or using estimates. I’m using very generic overall CTR estimates, but you can choose whichever you’d like and get as granular as you want! The main point for me is really getting to two numbers that I can stand by as pretty good estimates: 

  • Annual Impact (Revenue $$)
  • Increase in Revenue ($$) from last year

This is because, for higher-up folks, money talks. Obviously, this won’t be something you can promise, but it gives them a metric they understand, so they can really wrap their heads around the value you’re potentially bringing to the table if the changes you’re recommending are made. 
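
To make the arithmetic concrete, here's one plausible reading of those two numbers with sample inputs. Every figure below is made up, and the exact definitions should match whatever your own spreadsheet uses:

```python
# All sample inputs -- swap in your own analytics data and CTR model.
monthly_search_volume = 40_000   # summed volume of the keywords you kept above
estimated_ctr = 0.20             # blended CTR estimate for your target positions
conversion_rate = 0.02           # from your analytics
average_order_value = 85.00      # from your analytics
last_year_revenue = 150_000.00   # for the section you're looking at

monthly_visits = monthly_search_volume * estimated_ctr                    # 8,000
monthly_revenue = monthly_visits * conversion_rate * average_order_value  # $13,600
annual_impact = monthly_revenue * 12                                      # $163,200

# One reading of "increase from last year": projected annual revenue
# minus last year's actual, in dollars.
increase_vs_last_year = annual_impact - last_year_revenue                 # $13,200

print(f"Annual impact:          ${annual_impact:,.0f}")
print(f"Increase vs last year:  ${increase_vs_last_year:,.0f}")
```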

There are some great tools for estimating this kind of stuff on a smaller scale, but for a massive body of keyword data, hopefully you will find this process useful as well. Let me know what you think, and I’d love to see what parts anyone else can streamline or make even more efficient. 
